katherine coleman goble johnson
marge__neurips_final_ (2)
MARGE performs comparably to XLM-R, but with significant variation across languages. We only show results for languages in all models' … Table 8: Number of documents per language used for pre-training.

Katherine G. Johnson (née Coleman; August 26, 1918 – February 24, 2020) was an … She contributed to the science of the U.S. Air Force and space programs, …
- North America > United States > Virginia > Newport News (0.06)
- North America > United States > West Virginia (0.05)
- Africa > Niger (0.05)
- Government > Space Agency (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
Pre-training via Paraphrasing
Mike Lewis, Marjan Ghazvininejad, Gargi Ghosh, Armen Aghajanyan, Sida Wang, Luke Zettlemoyer
We introduce MARGE, a pre-trained sequence-to-sequence model learned with an unsupervised multi-lingual multi-document paraphrasing objective. MARGE provides an alternative to the dominant masked language modeling paradigm, where we self-supervise the reconstruction of target text by retrieving a set of related texts (in many languages) and conditioning on them to maximize the likelihood of generating the original. We show it is possible to jointly learn to do retrieval and reconstruction, given only a random initialization. The objective noisily captures aspects of paraphrase, translation, multi-document summarization, and information retrieval, allowing for strong zero-shot performance on several tasks. For example, with no additional task-specific training we achieve BLEU scores of up to 35.8 for document translation. We further show that fine-tuning gives strong performance on a range of discriminative and generative tasks in many languages, making MARGE the most generally applicable pre-training method to date.
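The abstract's objective — retrieve related documents, then maximize the likelihood of reconstructing the target conditioned on them, with relevance scores weighting each evidence document — can be sketched in miniature. Everything below is a toy stand-in for illustration only: the mean-pooled embedding encoder, the smoothed bag-of-words "decoder", and the integer token ids are assumptions, not MARGE's actual transformer architecture or training setup:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 50, 16

# Toy embedding table standing in for a learned document encoder.
E = rng.normal(size=(VOCAB, DIM))

def encode(doc):
    """Embed a document (list of token ids) by mean-pooling token vectors."""
    return E[doc].mean(axis=0)

def relevance(x, z):
    """Cosine similarity between target and evidence embeddings."""
    ex, ez = encode(x), encode(z)
    return float(ex @ ez / (np.linalg.norm(ex) * np.linalg.norm(ez)))

def reconstruction_loss(target, evidence):
    """Relevance-weighted negative log-likelihood of the target tokens.

    Each evidence doc votes for its own tokens; votes are mixed with
    softmax-normalized relevance scores, so (in the real model) the
    retriever only earns gradient credit when the documents it scores
    highly actually help reconstruct the target.
    """
    scores = np.array([relevance(target, z) for z in evidence])
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over evidence
    probs = np.full(VOCAB, 1e-3)                     # smoothing floor
    for w, z in zip(weights, evidence):
        for tok in z:
            probs[tok] += w / len(z)
    probs /= probs.sum()
    return float(-np.log(probs[target]).mean())

target  = [1, 2, 3, 4]
good_ev = [[1, 2, 3, 5], [2, 3, 4, 6]]   # overlaps the target: paraphrase-like
bad_ev  = [[40, 41, 42], [43, 44, 45]]   # unrelated evidence
print(reconstruction_loss(target, good_ev) < reconstruction_loss(target, bad_ev))
```

The point of the sketch is the coupling the abstract describes: related evidence yields a lower reconstruction loss than unrelated evidence, which is the signal that lets retrieval and reconstruction be learned jointly from a random initialization.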
- North America > United States > Minnesota > Hennepin County > Minneapolis (0.14)
- North America > United States > Virginia > Newport News (0.04)
- North America > United States > West Virginia (0.04)
- Africa > Niger (0.04)
- Government > Space Agency (0.95)
- Government > Regional Government > North America Government > United States Government (0.95)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Machine Translation (0.70)
- Information Technology > Artificial Intelligence > Natural Language > Information Retrieval (0.66)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (0.49)